Structured Sparsity: from Mixed Norms to Structured Shrinkage

Author

  • M. Kowalski
Abstract

Sparse and structured signal expansions on dictionaries can be obtained through explicit modeling in the coefficient domain. The originality of the present contribution lies in the construction and study of generalized shrinkage operators, whose goal is to identify structured significance maps. These generalize Group LASSO and the previously introduced Elitist LASSO by allowing more flexibility in the coefficient-domain modeling. We experimentally study the performance of the corresponding shrinkage operators for significance map estimation in the orthogonal basis case. We also study their performance in the overcomplete situation, using iterative thresholding.
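In the overcomplete case mentioned above, shrinkage operators are typically applied inside an iterative thresholding loop. As a minimal illustration (not the paper's exact algorithm), the sketch below shows standard iterative soft-thresholding (ISTA) for an ℓ1-penalized synthesis model; the function name and parameters are illustrative.

```python
import numpy as np

def ista(D, y, lam, step, n_iter=200):
    """Iterative soft-thresholding for min_x 0.5*||y - D @ x||^2 + lam*||x||_1.

    D is an (overcomplete) dictionary; step should satisfy
    step <= 1 / ||D||_2^2 for the iteration to converge.
    """
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        # Gradient step on the quadratic data-fidelity term.
        grad = D.T @ (D @ x - y)
        z = x - step * grad
        # Soft-thresholding: the proximal operator of lam*||.||_1.
        x = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)
    return x
```

Replacing the scalar soft-thresholding line with a structured shrinkage operator (group, elitist, or a generalization) is what changes which significance maps the iteration promotes.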


Similar articles

Sparsity and persistence: mixed norms provide simple signal models with dependent coefficients

Sparse regression often uses ℓp norm priors (with p < 2). This paper demonstrates that the introduction of mixed norms in such contexts allows one to go one step beyond in signal models, and promote some different, structured, forms of sparsity. It is shown that the particular case of the ℓ1,2 and ℓ2,1 norms leads to new group shrinkage operators. Mixed norm priors are shown to be particularly ...
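The group shrinkage operator associated with the ℓ2,1 norm (the Group LASSO penalty) has a simple closed form: each group of coefficients is shrunk toward zero by its ℓ2 norm, and groups whose norm falls below the threshold vanish entirely. A minimal sketch, with illustrative names and a non-overlapping group partition assumed:

```python
import numpy as np

def group_soft_threshold(x, groups, lam):
    """Proximal operator of lam * ||x||_{2,1} for a partition of indices.

    Each group x_g is scaled by max(0, 1 - lam / ||x_g||_2), so entire
    groups are set to zero when their l2 norm is below lam.
    """
    out = np.zeros_like(x, dtype=float)
    for g in groups:
        norm = np.linalg.norm(x[g])
        if norm > lam:
            out[g] = (1.0 - lam / norm) * x[g]
    return out
```

This all-or-nothing behavior per group is what produces the structured, "persistent" significance maps discussed above, in contrast to the coefficient-by-coefficient action of plain soft-thresholding.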

Full text

Structured Variable Selection with Sparsity-Inducing Norms

We consider the empirical risk minimization problem for linear supervised learning, with regularization by structured sparsity-inducing norms. These are defined as sums of Euclidean norms on certain subsets of variables, extending the usual ℓ1-norm and the group ℓ1-norm by allowing the subsets to overlap. This leads to a specific set of allowed nonzero patterns for the solutions of such problem...

Full text

Wavelet shrinkage using adaptive structured sparsity constraints

Structured sparsity approaches have recently received much attention in the statistics, machine learning, and signal processing communities. A common strategy is to exploit or assume prior information about structural dependencies inherent in the data; the solution is encouraged to behave as such by the inclusion of an appropriate regularization term which enforces structured sparsity constrain...

Full text

Sparse Regression Using Mixed Norms

Mixed norms are used to exploit, in an easy way, both structure and sparsity in the framework of regression problems, and introduce implicitly couplings between regression coefficients. Regression is done through optimization problems, and corresponding algorithms are described and analyzed. Besides the classical sparse regression problem, multilayered expansion on unions of dictionaries of signa...

Full text

An Inequality with Applications to Structured Sparsity and Multitask Dictionary Learning

From concentration inequalities for the suprema of Gaussian or Rademacher processes an inequality is derived. It is applied to sharpen existing and to derive novel bounds on the empirical Rademacher complexities of unit balls in various norms appearing in the context of structured sparsity and multitask dictionary learning or matrix factorization. A key role is played by the largest eigenvalue ...

Full text


Publication date: 2009